3 - Pattern Recognition (PR) [ID:2417]

[MUSIC]

The following content has been provided by the University of Erlangen-Nürnberg.

Good afternoon everybody, I'm sorry for the delayed start, but today we changed the lecture hall and that's the reason why we had some kind of confusion. In the future, on Mondays we will meet at 3 PM sharp in this room, and tomorrow Andre is here; please check the web to see where the lecture will take place tomorrow morning, at 10:15 or 10:30.

Okay, before I continue with the material, let me use a few seconds to bring you back to the storyline that we started to discuss last week. In the lecture on pattern recognition we basically talk about classification. Classification is nothing else but reading a feature vector x and mapping that feature vector to a class y, where y is a class index, a discrete number, and x is a real-valued d-dimensional feature vector.
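(As a small sketch, not from the lecture itself: in code, such a decision function is simply a mapping from a real-valued d-dimensional feature vector to a discrete class index. The thresholding rule below is purely a made-up placeholder.)

```python
import numpy as np

# Hypothetical placeholder for a decision function: it takes a
# real-valued d-dimensional feature vector x and returns a discrete
# class index y. The actual rule here is a toy threshold, nothing more.
def decision_function(x: np.ndarray) -> int:
    return 1 if np.mean(x) > 0.0 else 0

x = np.array([0.3, -1.2, 2.5])     # a feature vector in R^3
y = decision_function(x)           # y is a class index (here 0 or 1)
```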

Last week I also pointed out what the difference is between classification and regression. Regression is something that you have seen in engineering mathematics already; it is basically nothing else but y being some kind of delta of x, where y is no longer a class index but a real value. As an example I showed you last week an affine function, but you can also think of having something like y = a_0 + a_1 x_1 + a_2 x_2 + ... + a_d x_d. That is a linear manifold in the d-dimensional space, and the computation of the a_i's, for instance, is the regression problem, given the data.
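(To make the computation of the a_i's concrete, here is a small sketch of how the coefficients of such a linear model could be estimated from data. Using ordinary least squares via numpy's lstsq is an assumption on my part; the lecture only states that computing the a_i's from the data is the regression problem.)

```python
import numpy as np

# Synthetic data for illustration: n observations of a d-dimensional
# feature vector x and a real-valued target y.
rng = np.random.default_rng(0)
n, d = 100, 3
X = rng.normal(size=(n, d))
true_a = np.array([1.0, 2.0, -0.5, 0.3])          # [a_0, a_1, ..., a_d]
y = true_a[0] + X @ true_a[1:] + 0.01 * rng.normal(size=n)

# Least-squares estimate of a_0, ..., a_d: augment each x with a
# leading 1 so that a_0 plays the role of the offset.
X_aug = np.hstack([np.ones((n, 1)), X])
a_hat, *_ = np.linalg.lstsq(X_aug, y, rcond=None)
print(a_hat)                                       # close to true_a
```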

So these are two things we have to consider, and during the lecture this kind of parallelism between classification and regression is something that we are going to consider several times. While introducing certain classifiers I will also point out how you can use the theoretical framework to solve regression problems, and we will see tons of different regression problems within the lecture. So when you hear classification, just call a subroutine in your brain telling you: the feature vector is mapped to a class index. If you hear regression, just remember that regression is nothing else but a mapping of a feature vector to a real-valued number. These are the two things we are considering, and in the lecture on pattern recognition we will basically discuss different ways of designing the decision function delta.

So the problem we are considering is how to design the decision function for a particular scenario, and we have seen last week that we can distinguish between supervised and unsupervised learning, which is nothing else but: in the supervised case we get feature vectors and the assigned class number y, and in the unsupervised case we just observe features and nothing about the structure of the classes, and then we have to learn the decision function delta based on these data sets.
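(In terms of the data sets themselves the distinction is compact; the following snippet is only an assumed illustration of what the two kinds of training sets look like, not code from the lecture.)

```python
import numpy as np

# Supervised case: pairs (x_i, y_i) of feature vectors and class numbers.
X_train = np.array([[0.1, 1.2],
                    [2.3, 0.4],
                    [1.1, 1.0]])
y_train = np.array([0, 1, 0])       # class indices are given

# Unsupervised case: feature vectors only, no class information;
# the class structure has to be inferred from the features alone.
X_unlabeled = np.array([[0.5, 0.9],
                        [2.0, 0.1]])
```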

We have also seen last week that we will make use of probability theory quite a lot, so we will talk about Bayesian classifiers first, and we have seen the decision rule of the Bayesian classifier, which is something that you should know by heart after the lecture series. The Bayesian classifier applies the decision
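(The decision rule referred to here is, in the usual notation, to pick the class y that maximizes the posterior p(y | x), which by Bayes' rule amounts to maximizing p(y) * p(x | y). The sketch below is my own illustration with assumed Gaussian class-conditional densities, not the lecture's implementation.)

```python
import numpy as np
from scipy.stats import multivariate_normal

# Assumed toy model: two classes with given priors p(y) and
# Gaussian class-conditional densities p(x | y).
priors = np.array([0.6, 0.4])
means = [np.array([0.0, 0.0]), np.array([2.0, 2.0])]
covs = [np.eye(2), np.eye(2)]

def bayes_decision(x: np.ndarray) -> int:
    """Return argmax over y of p(y) * p(x | y), i.e. the maximum-posterior class."""
    scores = [priors[y] * multivariate_normal.pdf(x, mean=means[y], cov=covs[y])
              for y in range(len(priors))]
    return int(np.argmax(scores))

print(bayes_decision(np.array([1.8, 2.1])))        # most likely class 1
```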

Part of a video series:
Accessible via: Open access
Duration: 00:39:41 min
Recording date: 2012-10-22
Uploaded on: 2012-10-26 09:29:05
Language: en-US
